The Stochastic Fejér-Monotone Hybrid Steepest Descent Method and the Hierarchical RLS
Authors
Abstract
Similar papers
Efficient Parallel Computation of the Stochastic MV-PURE Estimator by the Hybrid Steepest Descent Method
In this paper we consider the problem of efficiently computing the stochastic MV-PURE estimator, a reduced-rank estimator designed for robust linear estimation in ill-conditioned inverse problems. Our motivation stems from the fact that reduced-rank estimation by the stochastic MV-PURE estimator, while avoiding the problem of regularization parameter selection app...
Hybrid steepest-descent method with sequential and functional errors in Banach space
Let $X$ be a reflexive Banach space, $T:X\to X$ a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ a $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive mapping with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
Hybrid Steepest Descent Method for Solving Hierarchical Fixed Point Approach to Variational Inequalities Constrained Optimization Problem
An explicit hierarchical fixed point algorithm is introduced to solve the monotone variational inequality over the fixed point set of a nonexpansive mapping. This paper discusses a monotone variational inequality with a variational constraint, together with convex optimization problems over the fixed point set of a nonexpansive mapping. The strong convergence of the proposed algorithm to the solution is gu...
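The iteration behind this family of methods can be illustrated with a minimal NumPy sketch of the classical hybrid steepest-descent update x_{n+1} = T(x_n) − λ_n F(T(x_n)) with diminishing steps. The operator choices below (projection onto the unit ball as T, the gradient of a quadratic as F) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def hybrid_steepest_descent(T, F, x0, n_iter=2000):
    """Hybrid steepest-descent iteration for the variational
    inequality <F(x*), x - x*> >= 0 over Fix(T):
        x_{n+1} = T(x_n) - lam_n * F(T(x_n)),
    with diminishing step sizes lam_n = 1/(n+1)."""
    x = np.asarray(x0, dtype=float)
    for n in range(n_iter):
        y = T(x)                     # nonexpansive step
        x = y - (1.0 / (n + 1)) * F(y)  # perturbation along -F
    return x

# Toy instance: T = projection onto the unit ball (Fix(T) = ball),
# F = gradient of f(x) = 0.5 * ||x - a||^2 (strongly monotone).
a = np.array([2.0, 0.0])
T = lambda x: x / max(1.0, np.linalg.norm(x))
F = lambda x: x - a

x_star = hybrid_steepest_descent(T, F, np.zeros(2))
# The VI solution over the ball is the projection of a, i.e. (1, 0).
```

With these choices the iterates approach the projection of `a` onto the ball, the unique solution of the variational inequality over the constraint set.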
On the Steepest Descent Method for Matrix
We consider the special case of the restarted Arnoldi method for approximating the product of a function of a Hermitian matrix with a vector which results when the restart length is set to one. When applied to the solution of a linear system of equations, this approach coincides with the method of steepest descent. We show that the method is equivalent to an interpolation process in which the...
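The linear-system special case mentioned in the abstract is easy to state concretely: for a Hermitian positive definite A, steepest descent steps along the residual with the exact line-search step size. A minimal sketch (my own toy matrix, not from the paper):

```python
import numpy as np

def steepest_descent(A, b, x0=None, tol=1e-10, max_iter=500):
    """Method of steepest descent for Ax = b with A Hermitian
    positive definite: move along the residual r = b - Ax with
    the exact line-search step alpha = <r, r> / <r, A r>."""
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, float)
    for _ in range(max_iter):
        r = b - A @ x
        rr = r @ r
        if rr < tol**2:          # residual small enough: stop
            break
        x = x + (rr / (r @ (A @ r))) * r
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])      # symmetric positive definite
b = np.array([1.0, 2.0])
x = steepest_descent(A, b)       # converges to A^{-1} b = (1/11, 7/11)
```

Setting the restart length of restarted Arnoldi to one reproduces exactly this residual-direction, one-dimensional minimization at every step.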
A hybrid steepest descent method for constrained convex optimization
This paper describes a hybrid steepest descent method that decreases over time any given convex cost function while keeping the optimization variables within any given convex set. The method takes advantage of properties of hybrid systems to avoid the computation of projections or of a dual optimum. The convergence to a global optimum is analyzed using Lyapunov stability arguments. A discretized imp...
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2019
ISSN: 1053-587X,1941-0476
DOI: 10.1109/tsp.2019.2907257